High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks

Authors: Liu, Ye, Lee

Abstract

In “High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks,” Liu, Ye, and Lee study model-fitting problems in which the data are far fewer than the dimensions. Their particular focus is on scenarios where the commonly imposed sparsity assumption is relaxed and the usual condition of restricted strong convexity is absent. The results show that generalization performance can still be ensured in such settings, even if the dimension grows exponentially. The authors further derive sample complexities for high-dimensional nonsmooth estimation and for neural networks. For the latter in particular, it is shown that, with explicit regularization, the network is provably generalizable with a sample size only poly-logarithmic in the number of parameters.
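To make the setting concrete, the following is a minimal, hedged sketch of the classic high-dimensional regime the abstract describes: far fewer samples than dimensions, an approximately sparse ground truth, and explicit l1 regularization (here solved by proximal gradient / ISTA). All names, dimensions, and the choice of solver are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (entrywise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iters=500):
    """Minimize (1/2n) * ||y - X w||^2 + lam * ||w||_1 via ISTA."""
    n, p = X.shape
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant of the gradient
    w = np.zeros(p)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
n, p, s = 50, 500, 3                 # far fewer samples (n) than dimensions (p)
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:s] = [2.0, -1.5, 1.0]        # approximately sparse ground truth
y = X @ w_true + 0.01 * rng.standard_normal(n)

w_hat = lasso_ista(X, y, lam=0.1)
```

Despite p being ten times n, the l1 penalty drives most coordinates of `w_hat` to zero, so the estimate concentrates on the true support; this is the kind of behavior whose sample-complexity guarantees the paper extends to relaxed sparsity assumptions.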


Similar Articles

LDMNet: Low Dimensional Manifold Regularized Neural Networks

Deep neural networks have proved very successful on archetypal tasks for which large training sets are available, but when the training data are scarce, their performance suffers from overfitting. Many existing methods of reducing overfitting are data-independent, and their efficacy is often limited when the training set is very small. Data-dependent regularizations are mostly motivated by the ...


Regularized Estimation of High-dimensional Covariance Matrices


Learning Structured Sparsity in Deep Neural Networks

High demand for computation resources severely hinders deployment of large-scale Deep Neural Networks (DNN) in resource constrained devices. In this work, we propose a Structured Sparsity Learning (SSL) method to regularize the structures (i.e., filters, channels, filter shapes, and layer depth) of DNNs. SSL can: (1) learn a compact structure from a bigger DNN to reduce computation cost; (2) ob...
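The structured-sparsity idea in the blurb above can be illustrated with a group-lasso-style penalty: a sum of l2 norms over groups of weights (e.g., all weights of one filter or channel), which pushes entire groups to exactly zero rather than individual entries. This is a hedged sketch of that penalty form only; the grouping and names are illustrative, not SSL's actual implementation.

```python
import numpy as np

def group_lasso_penalty(W, groups):
    """Sum of l2 norms over weight groups; zeroing a whole group reduces the
    penalty, so entire structures (filters, channels) can be pruned."""
    return sum(np.linalg.norm(W[g]) for g in groups)

# Toy weight vector with two "filters" of three weights each.
W = np.array([0.0, 0.0, 0.0, 3.0, 4.0, 0.0])
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
print(group_lasso_penalty(W, groups))  # norm of [0,0,0] is 0, of [3,4,0] is 5 -> 5.0
```

Because the penalty is the norm of each group rather than of each entry, gradient-based training with this regularizer tends to remove whole filters at once, which is what yields the hardware-friendly compact structures described above.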


Learning with Sparsity: Structures, Optimization and Applications

The development of modern information technology has enabled collecting data of unprecedented size and complexity. Examples include web text data, microarray & proteomics, and data from scientific domains (e.g., meteorology). To learn from these high dimensional and complex data, traditional machine learning techniques often suffer from the curse of dimensionality and unaffordable computational...


Regularized Learning with Feature Networks

REGULARIZED LEARNING WITH FEATURE NETWORKS S. Ted Sandler Lyle H. Ungar In this thesis, we present Regularized Learning with Feature Networks (RLFN), an approach for regularizing the feature weights of a regression model when one has prior beliefs about which regression coefficients are similar in value. These similarities are specified as a feature network whose semantic interpretation is that...
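A penalty of the kind the RLFN summary describes can be sketched as a sum of squared differences between coefficients that are linked in the feature network, which is equivalent to a quadratic form with the graph Laplacian. This is a minimal illustrative sketch under that assumption, not the thesis's actual formulation.

```python
import numpy as np

def network_penalty(w, edges):
    """Sum of (w_i - w_j)^2 over feature-network edges; linked coefficients
    are encouraged to take similar values. Equals w^T L w for Laplacian L."""
    return sum((w[i] - w[j]) ** 2 for i, j in edges)

w = np.array([1.0, 1.1, -2.0])
edges = [(0, 1)]   # prior belief: features 0 and 1 have similar coefficients
print(network_penalty(w, edges))
```

Adding this term to a regression loss shrinks linked coefficients toward each other rather than toward zero, encoding the prior similarity beliefs mentioned above.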



Journal

Journal title: Operations Research

Year: 2022

ISSN: 1526-5463, 0030-364X

DOI: https://doi.org/10.1287/opre.2021.2217